Quasi-Newton Methods: A New Direction

Authors

  • Philipp Hennig
  • Martin Kiefel
Abstract

Four decades after their invention, quasi-Newton methods are still state of the art in unconstrained numerical optimization. Although not usually interpreted thus, these are learning algorithms that fit a local quadratic approximation to the objective function. We show that many, including the most popular, quasi-Newton methods can be interpreted as approximations of Bayesian linear regression under varying prior assumptions. This new notion elucidates some shortcomings of classical algorithms, and lights the way to a novel nonparametric quasi-Newton method, which is able to make more efficient use of available information at computational cost similar to its predecessors.
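To make the learning-algorithm view concrete, here is a minimal NumPy sketch of the classical BFGS update, the best-known member of the family the paper reinterprets: the update is the least-change symmetric fit of a local quadratic model to one observed gradient difference. This is standard BFGS, not the paper's nonparametric method, and the toy Hessian and step are invented for illustration.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Classical BFGS update of a Hessian estimate B.

    Returns the least-change symmetric update satisfying the secant
    condition B_new @ s == y, i.e. the local quadratic model is made
    consistent with the observed gradient difference y along step s.
    """
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# Toy check on a quadratic f(x) = 0.5 x^T A x, where gradient
# differences are exact: one update enforces the secant condition.
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # invented true Hessian
s = np.array([1.0, -0.5])                # a step
y = A @ s                                # gradient difference along s
B = bfgs_update(np.eye(2), s, y)
assert np.allclose(B @ s, y)
```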


Similar resources

New hybrid conjugate gradient method for unconstrained optimization

Conjugate gradient methods are widely used for unconstrained optimization, especially on large-scale problems. Most conjugate gradient methods do not always generate a descent search direction, so the descent condition is usually assumed in analyses and implementations. Dai and Yuan (1999) proposed a conjugate gradient method that generates a descent direction at every iteration. Yabe and...
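As an illustration of the descent-direction point, the sketch below runs nonlinear CG with the Dai-Yuan (1999) parameter on an invented quadratic, using the exact line search available in that case; on quadratics with exact line searches this coincides with linear CG and reaches the minimizer in at most n steps.

```python
import numpy as np

# Toy problem: f(x) = 0.5 x^T A x - b^T x with A symmetric positive
# definite (both A and b are made up for this demo).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])

x = np.zeros(2)
g = A @ x - b                              # gradient of f
d = -g                                     # initial steepest-descent direction
for _ in range(len(b)):
    alpha = -(g @ d) / (d @ (A @ d))       # exact minimizer along d
    x = x + alpha * d
    g_new = A @ x - b
    beta = (g_new @ g_new) / (d @ (g_new - g))  # Dai-Yuan parameter
    d = -g_new + beta * d                  # guaranteed descent direction
    g = g_new
assert np.allclose(A @ x, b)               # minimizer reached in n steps
```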


Planar Quasi-Newton Algorithms for Unconstrained Saddlepoint Problems

A new class of quasi-Newton methods is introduced that can locate a unique stationary point of an n-dimensional quadratic function in at most n steps. When applied to positive-definite or negative-definite quadratic functions, the new class is identical to Huang's symmetric family of quasi-Newton methods (Ref. 1). Unlike the latter, however, the new family can handle indefinite quadratic forms ...
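For intuition on why indefinite quadratics need special handling, here is a hedged sketch (not the planar family described above) using the symmetric rank-one (SR1) update, which, unlike BFGS, can produce an indefinite Hessian estimate and hence locate a saddle point; the test problem is invented.

```python
import numpy as np

# Indefinite quadratic f(x) = 0.5 x^T A x + b^T x: its unique
# stationary point is a saddle, not a minimum.
A = np.array([[2.0, 0.0], [0.0, -1.0]])   # one positive, one negative eigenvalue
b = np.array([1.0, 1.0])
grad = lambda x: A @ x + b

B = np.eye(2)                              # initial Hessian estimate
x = np.zeros(2)
for s in np.eye(2):                        # probe along n independent steps
    y = grad(x + s) - grad(x)              # exact curvature information
    r = y - B @ s
    if abs(r @ s) > 1e-12:                 # standard SR1 safeguard
        B = B + np.outer(r, r) / (r @ s)   # symmetric rank-one update
assert np.allclose(B, A)                   # indefinite Hessian recovered
x_star = np.linalg.solve(B, -b)            # Newton step to the saddle
assert np.allclose(grad(x_star), 0.0)
```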


A combined conjugate-gradient quasi-Newton minimization algorithm

Although quasi-Newton algorithms generally converge in fewer iterations than conjugate gradient algorithms, they have the disadvantage of requiring substantially more storage. An algorithm is described that uses an intermediate (and variable) amount of storage and demonstrates convergence that is also intermediate, that is, generally better than that observed for conjugate gradient...
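The storage trade-off described here was later standardized in limited-memory methods. As a point of comparison (not this paper's algorithm), the sketch below shows the L-BFGS two-loop recursion, which applies an implicit inverse-Hessian estimate stored only as the m most recent step/gradient-difference pairs.

```python
import numpy as np

def two_loop_direction(g, S, Y):
    """L-BFGS two-loop recursion.

    Applies an implicit inverse-Hessian estimate, represented only by
    the m most recent (step, gradient-difference) pairs in lists S
    and Y, to the gradient g. Storage is O(m n) instead of the O(n^2)
    of a full quasi-Newton matrix.
    """
    q = g.copy()
    alphas = []
    for s, y in zip(reversed(S), reversed(Y)):          # newest pair first
        a = (s @ q) / (y @ s)
        q -= a * y
        alphas.append(a)
    if S:                                               # scaled seed matrix
        s, y = S[-1], Y[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), a in zip(zip(S, Y), reversed(alphas)):  # oldest pair first
        beta = (y @ q) / (y @ s)
        q += (a - beta) * s
    return -q                                           # search direction

# Toy checks on pairs from an invented quadratic: the implicit matrix
# satisfies the secant equation for the newest pair, and the returned
# direction is a descent direction.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
S = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
Y = [A @ s for s in S]
assert np.allclose(two_loop_direction(Y[-1], S, Y), -S[-1])
g = np.array([1.0, 2.0])
assert g @ two_loop_direction(g, S, Y) < 0
```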


Extra multistep BFGS updates in quasi-Newton methods

This note focuses on developing quasi-Newton methods that combine m + 1 multistep and single-step updates on a single iteration, for the sake of constructing the new approximation to the Hessian matrix to be used on the next iteration in computing the search direction. The approach considered here exploits the merits of the multistep methods and those of El-Baali (1999) to create a hybrid techniqu...
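A hedged sketch of the underlying idea: on a quadratic, any linear combination of recent (s, y) pairs is again a valid secant pair, which is what lets multistep methods blend curvature information from several past steps into one update. The fixed weight `delta` below is a hypothetical stand-in for the interpolation-derived weights of the actual methods.

```python
import numpy as np

def combined_secant_pair(s_prev, y_prev, s_cur, y_cur, delta=0.5):
    """Blend two (step, gradient-difference) pairs into one.

    On a quadratic with Hessian A every pair satisfies y_i = A s_i,
    so by linearity the combined pair still satisfies
    y_tilde = A s_tilde and can be fed to a BFGS-type update.
    """
    return s_cur - delta * s_prev, y_cur - delta * y_prev

A = np.array([[3.0, 1.0], [1.0, 2.0]])               # invented Hessian
s1, s2 = np.array([1.0, 0.0]), np.array([0.5, 1.0])  # two recent steps
s_t, y_t = combined_secant_pair(s1, A @ s1, s2, A @ s2)
assert np.allclose(A @ s_t, y_t)                     # still a secant pair
```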



Journal:
  • Journal of Machine Learning Research

Volume 14, Issue –

Pages –

Publication date: 2012